Conversation


@sorenmacbeth commented Feb 20, 2025

Add a `lr_scheduler_interval` field to `OptimizerConfig` to allow setting the scheduler interval to `step`, as required by `OneCycleLR`.


📚 Documentation preview 📚: https://pytorch-tabular--545.org.readthedocs.build/en/545/
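To illustrate why this field matters: PyTorch Lightning steps a scheduler per epoch by default, but `OneCycleLR` computes a per-batch schedule and must be stepped with `interval="step"`. The sketch below is a hypothetical simplification of the new field and how a Lightning-style scheduler dict would consume it; pytorch-tabular's real `OptimizerConfig` has additional fields and validation.

```python
from dataclasses import dataclass


# Hypothetical sketch, assuming a config shaped like the PR describes;
# the real pytorch-tabular class differs in detail.
@dataclass
class OptimizerConfig:
    lr_scheduler: str = "OneCycleLR"
    lr_scheduler_interval: str = "epoch"  # new field; use "step" for OneCycleLR


def scheduler_dict(scheduler, config: OptimizerConfig) -> dict:
    """Build the Lightning-style lr_scheduler dict.

    With interval="epoch", OneCycleLR would advance only once per epoch
    and never complete its cycle; "step" advances it every batch.
    """
    return {"scheduler": scheduler, "interval": config.lr_scheduler_interval}


cfg = OptimizerConfig(lr_scheduler_interval="step")
print(scheduler_dict(object(), cfg)["interval"])  # step
```

A `LightningModule.configure_optimizers` would return this dict alongside the optimizer, so the trainer calls `scheduler.step()` at the configured interval.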

NumPy defaults to `numpy.float64` where the values should be `numpy.float32`.

This caused training to fail on MPS devices; with this change it works on my M1.
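The failure mode is easy to reproduce: NumPy creates `float64` arrays by default, and PyTorch's MPS backend (Apple Silicon) does not support `float64` tensors, so an explicit downcast to `float32` is needed before the data reaches the device. A minimal sketch (NumPy only, no torch required):

```python
import numpy as np

# NumPy's default floating dtype is float64.
a = np.array([0.1, 0.2, 0.3])
print(a.dtype)  # float64

# Casting to float32 before building the tensor avoids the MPS failure,
# since the MPS backend cannot allocate float64 tensors.
a32 = a.astype(np.float32)
print(a32.dtype)  # float32
```

In the library this corresponds to making the affected arrays `np.float32` explicitly instead of relying on the `float64` default.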
@sorenmacbeth force-pushed the optimizer-lr-scheduler-interval branch from a285eb5 to a196f80 on February 20, 2025 05:34
@manujosephv merged commit 7970abf into pytorch-tabular:main on Apr 8, 2025
9 checks passed

Labels: enhancement (New feature or request), size:S (10-29 lines changed, ignoring generated files)
